Neural Network Gradient Descent | Activation Functions | Machine Learning by Debabrata Bhakat [EP 8]

Duration: 18:59
Views: 298
Likes: 6
Date Created: Jul 2021

Channel: Learn By Watch

Category: Education

Tags: activation functions, softmax function, relu function, sigmoid function, machine learning course, softmax activation function, activation function in neural network, relu activation function, neural network, sigmoid activation function, types of activation function in neural network, tanh activation function, neural network gradient descent algorithm, tanh function, machine learning tutorial, sigmoid function neural network, gradient descent neural network, gradient descent

Description: Gradient descent is used in neural networks during the backpropagation phase: the goal is to repeatedly move each parameter in the direction opposite its gradient with respect to the weights w, updating until we reach the minimum of the cost function J(w). In this video you will learn about these topics:

● Recap: how a single neuron works, deep neural networks, dimensions, and forward propagation, since these are needed for gradient descent; also a look at what an activation function is.
● Tanh activation function: used when we want the output between -1 and 1. A popular activation function, generally used in the hidden layers.
● ReLU activation function: introduces non-linearity and is also used in the hidden layers. The most widely used activation function.
● Sigmoid activation function: used in the final layer when we classify between two classes.
● Softmax activation function: also used in the final layer, but when we want to classify into more than two classes. (A NumPy sketch of these four functions follows this list.)
● Gradient descent: the gradient descent steps and the mathematics behind how they are calculated. (A worked update-rule sketch appears below.)
● Random initialisation: if all our parameters are initialised to zero, there is no difference between the neurons in a layer, so random initialisation is important to break the symmetry. (See the initialisation sketch at the end.)
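To make the four activation functions above concrete, here is a minimal NumPy sketch. The function names and the max-subtraction inside softmax (a standard numerical-stability trick) are my own choices, not taken from the video.

import numpy as np

def tanh(z):
    # Squashes inputs to (-1, 1); commonly used in hidden layers.
    return np.tanh(z)

def relu(z):
    # Zeroes out negatives, introducing non-linearity; common in hidden layers.
    return np.maximum(0, z)

def sigmoid(z):
    # Squashes inputs to (0, 1); used in the final layer for two-class problems.
    return 1.0 / (1.0 + np.exp(-z))

def softmax(z):
    # Normalises scores into probabilities over more than two classes.
    # Subtracting the max avoids overflow without changing the result.
    e = np.exp(z - np.max(z))
    return e / e.sum()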
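The gradient descent step the video derives is the update w := w - alpha * dJ/dw. A minimal sketch of that rule, using an illustrative quadratic cost J(w) = (w - 3)^2 and learning rate alpha = 0.1 (both are assumptions for the example, not values from the video):

def gradient_descent(grad, w0, alpha=0.1, steps=100):
    # Repeatedly step opposite the gradient: w <- w - alpha * grad(w).
    w = w0
    for _ in range(steps):
        w = w - alpha * grad(w)
    return w

# Illustrative cost J(w) = (w - 3)^2, so dJ/dw = 2 * (w - 3);
# the iterates converge towards the minimum at w = 3.
w_min = gradient_descent(lambda w: 2 * (w - 3), w0=0.0)
print(w_min)  # approximately 3.0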
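The symmetry argument for random initialisation can also be seen directly: with all-zero weights, every neuron in a layer computes the same output and receives the same gradient, so the neurons never differentiate. A sketch, where the layer sizes and the 0.01 scale factor are illustrative assumptions:

def init_layer(n_in, n_out, rng=np.random.default_rng(0)):
    # Small random weights break the symmetry between neurons;
    # biases can safely start at zero.
    W = rng.standard_normal((n_out, n_in)) * 0.01
    b = np.zeros((n_out, 1))
    return W, b

W, b = init_layer(n_in=4, n_out=3)
# With W initialised to zeros instead, every row (neuron) would compute
# the same value and update identically on every step.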
